Maximum likelihood Hebbian rules

Authors

  • Colin Fyfe
  • Emilio Corchado
Abstract

In this paper, we review an extension of the learning rules in a Principal Component Analysis network that was derived to be optimal for a specific probability density function. We note that this pdf is one member of a family of pdfs, and we investigate the learning rules derived to be optimal for several members of this family. We show that, whereas previous authors [5] have viewed the single member of the family as an extension of PCA, it is more appropriate to view the whole family of learning rules as methods of performing Exploratory Projection Pursuit. We illustrate this on artificial data sets.
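The family of rules described above can be sketched in code. The sketch below is a minimal NumPy implementation of the assumed form of the Maximum Likelihood Hebbian rule for a negative-feedback network: a feedforward activation, a residual after feedback, and a weight update shaped by an exponent `p` that selects a member of the generalized (exponential) family of pdfs; `p = 2` recovers the standard PCA-like rule. The function name `mlhl_train` and all parameter defaults are illustrative choices, not from the paper.

```python
import numpy as np

def mlhl_train(X, n_outputs, p=2.0, eta=0.01, epochs=100, seed=0):
    """Sketch of a Maximum Likelihood Hebbian learning rule (assumed form):
         activation:  y = W x
         feedback:    e = x - W^T y
         update:      dW = eta * outer(y, sign(e) * |e|^(p-1))
       p parameterises the pdf family; p = 2 gives the PCA-like rule."""
    rng = np.random.default_rng(seed)
    n_inputs = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_outputs, n_inputs))
    for _ in range(epochs):
        for x in X:
            y = W @ x                 # feedforward activation
            e = x - W.T @ y           # residual after negative feedback
            W += eta * np.outer(y, np.sign(e) * np.abs(e) ** (p - 1))
    return W

# Toy usage: 2-D data with one dominant variance direction.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2)) * np.array([3.0, 0.5])
W = mlhl_train(X, n_outputs=1, p=2.0)
```

With `p = 2.0` the learned weight vector aligns with the high-variance axis of the toy data, as expected for the PCA-like member of the family; other values of `p` bias the projection toward directions whose residuals better match the corresponding pdf.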


Related articles

Minimum Entropy Algorithms for Source Separation

Minimum entropy or maximum likelihood estimation can be used in the blind source separation problem. Based on a local generalized Gaussian probability density model, a set of general anti-Hebbian rules can be derived. This set of adaptation rules gives promising results when tested on real recordings.


Maximum Likelihood Hebbian Learning Based Retrieval Method for CBR Systems

CBR systems are normally used to assist experts in the resolution of problems. During the last few years, researchers have been working on techniques to automate the reasoning stages identified in this methodology. This paper presents a Maximum Likelihood Hebbian Learning-based method that automates the organisation of cases and the retrieval stage of case-based reasoning syste...


Independent component analysis by general nonlinear Hebbian-like learning rules

A number of neural learning rules have recently been proposed for independent component analysis (ICA). The rules are usually derived from information-theoretic criteria such as maximum entropy or minimum mutual information. In this paper, we show that, in fact, ICA can be performed by very simple Hebbian or anti-Hebbian learning rules, which may have only weak relations to such information-theo...


Correlated sequence learning in a network of spiking neurons using maximum likelihood

Hopfield Networks are an idealised model of distributed computation in networks of non-linear, stochastic units. We consider the learning of correlated temporal sequences using Maximum Likelihood, deriving a simple Hebbian-like learning rule that is capable of robustly storing multiple sequences of correlated patterns. We argue that the learning rule is optimal for the case of long temporal seq...


Observer-participant models of neural processing

A model is proposed in which the neuron serves as an information channel. Channel distortion occurs because the mapping from input Boolean codes to output codes is many-to-one: neuron outputs consist of just two distinguished states. Within the described model, the neuron performs a decision-making function. Decisions are made regarding the validity of a question passi...



Journal:

Volume   Issue

Pages  -

Publication date: 2002